
    High Dimensional Consistent Digital Segments

    We consider the problem of digitalizing Euclidean line segments from R^d to Z^d. Christ et al. (DCG, 2012) showed how to construct a set of consistent digital segments (CDS) for d=2: a collection of segments connecting any two points in Z^2 that satisfies the natural extension of the Euclidean axioms to Z^d. In this paper we study the construction of CDSs in higher dimensions. We show that any total order can be used to create a set of consistent digital rays (CDR) in Z^d (a set of rays emanating from a fixed point p that satisfies the extension of the Euclidean axioms). We fully characterize for which total orders the construction holds and study their Hausdorff distance, which in particular positively answers the question posed by Christ et al.
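The consistency axioms have a simple structural reading: any rule that assigns each lattice point a unique "parent" one step closer to the origin yields a spanning tree whose root-to-point paths are rays, and every subray of a ray is automatically a ray. The toy rule below (step back in the larger coordinate) is a hypothetical stand-in, not the paper's order-based construction, and its Hausdorff error is poor; it is only meant to show why tree-shaped constructions are consistent.

```python
def parent(p):
    """Toy parent rule in the first quadrant of Z^2: step toward the
    origin in the coordinate that is currently larger (ties go to x).
    Illustrative only -- NOT the paper's total-order construction."""
    x, y = p
    if x == 0:
        return (0, y - 1)
    if y == 0:
        return (x - 1, 0)
    return (x - 1, y) if x >= y else (x, y - 1)

def ray(q):
    """Digital ray from the origin to q, built by following parents."""
    path = [q]
    while path[-1] != (0, 0):
        path.append(parent(path[-1]))
    return path[::-1]

# Consistency check: for every point r on ray(q), ray(r) is a prefix of
# ray(q).  This holds for any deterministic parent rule, because the
# parent pointers form a spanning tree rooted at the origin.
r = ray((5, 3))
for i, p in enumerate(r):
    assert ray(p) == r[:i + 1]
```

The interesting part of the paper is then not consistency itself but which rules (here, which total orders) keep the Hausdorff distance to the Euclidean ray small.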

    Computational Complexity of the α-Ham-Sandwich Problem

    The classic Ham-Sandwich theorem states that for any d measurable sets in R^d, there is a single hyperplane bisecting all of them simultaneously. Bárány, Hubard, and Jerónimo [DCG 2008] showed that if the sets are convex and well-separated, then for any given α_1, …, α_d ∈ [0,1] there is a unique oriented hyperplane cutting off a respective fraction α_1, …, α_d from each set. Steiger and Zhao [DCG 2010] proved a discrete analogue of this theorem, which we call the α-Ham-Sandwich theorem. They gave an algorithm to find the hyperplane in time O(n (log n)^{d-3}), where n is the total number of input points. The computational complexity of this search problem in high dimensions is open, quite unlike the complexity of the Ham-Sandwich problem, which is now known to be PPA-complete (Filos-Ratsikas and Goldberg [STOC 2019]). Recently, Fearnley, Gordon, Mehta, and Savani [ICALP 2019] introduced a new sub-class of CLS (Continuous Local Search) called Unique End-of-Potential Line (UEOPL). This class captures problems in CLS that have unique solutions. We show that for the α-Ham-Sandwich theorem, the search problem of finding the dividing hyperplane lies in UEOPL. This gives the first non-trivial containment of the problem in a complexity class and places it in the company of classic search problems such as finding the fixed point of a contraction map, the unique sink orientation problem, and the P-matrix linear complementarity problem.
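The UEOPL flavor of the problem is easiest to see in one dimension. For n distinct reals and a target count k, the candidate cuts (the input points, in sorted order) form a "line" along which the potential (number of points strictly below the cut) increases by one per step, so there is a unique candidate realizing potential exactly k. The sketch below is a toy 1-D illustration under these assumptions, not the paper's algorithm or reduction.

```python
def alpha_cut_1d(points, k):
    """Toy 1-D analogue of the discrete alpha-Ham-Sandwich search:
    among n distinct reals, find the unique point with exactly k
    points strictly below it.  Walking the sorted candidates mimics
    following a potential line to its unique end (illustration only)."""
    line = sorted(points)              # candidate cuts, in order
    for candidate in line:             # potential grows by 1 per step
        below = sum(p < candidate for p in line)
        if below == k:
            return candidate
    raise ValueError("k out of range")
```

In one dimension the walk is overkill (the answer is simply the (k+1)-th smallest point); the difficulty studied in the paper is that in higher dimensions the potential line is far less explicit.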

    A Generalization of Self-Improving Algorithms

    Ailon et al. [SICOMP'11] proposed self-improving algorithms for sorting and Delaunay triangulation (DT) when the input instances x_1, …, x_n follow some unknown product distribution. That is, x_i comes from a fixed unknown distribution D_i, and the x_i's are drawn independently. After spending O(n^{1+ε}) time in a learning phase, the subsequent expected running time is O((n + H)/ε), where H ∈ {H_S, H_DT}, and H_S and H_DT are the entropies of the distributions of the sorting and DT output, respectively. In this paper, we allow dependence among the x_i's under the group product distribution. There is a hidden partition of [1, n] into groups; the x_i's in the k-th group are fixed unknown functions of the same hidden variable u_k; and the u_k's are drawn from an unknown product distribution. We describe self-improving algorithms for sorting and DT under this model when the functions that map u_k to the x_i's are well-behaved. After an O(poly(n))-time training phase, we achieve O(n + H_S) and O(nα(n) + H_DT) expected running times for sorting and DT, respectively, where α(·) is the inverse Ackermann function.
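The two-phase structure can be sketched with a much-simplified bucket sorter: a learning phase pools sample instances and extracts evenly spaced order statistics as bucket boundaries (a crude stand-in for the V-list of Ailon et al.), and an operation phase routes each value to its learned bucket and sorts buckets individually. All names and parameters below are illustrative assumptions, and none of the entropy-optimality machinery of the paper is reproduced.

```python
import bisect
import random

def learn_buckets(sample_instances, n_buckets):
    """Learning phase (toy): pool sample instances drawn from the
    unknown distribution and keep evenly spaced order statistics
    as bucket boundaries."""
    pool = sorted(v for inst in sample_instances for v in inst)
    step = max(1, len(pool) // n_buckets)
    return pool[step::step]

def self_improving_sort(x, boundaries):
    """Operation phase (toy): route each x_i to its bucket in O(log n)
    (near-constant when the distribution is concentrated), then sort
    the small buckets and concatenate.  Buckets are ordered ranges,
    so the concatenation is globally sorted."""
    buckets = [[] for _ in range(len(boundaries) + 1)]
    for v in x:
        buckets[bisect.bisect_left(boundaries, v)].append(v)
    out = []
    for b in buckets:
        out.extend(sorted(b))
    return out
```

When the distribution concentrates each x_i in O(1) buckets, the work beyond the linear routing pass scales with how unpredictable the output order is, which is the intuition behind the O(n + H_S) bound.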

    Distance Bounds for High Dimensional Consistent Digital Rays and 2-D Partially-Consistent Digital Rays

    We consider the problem of digitalizing Euclidean segments. Specifically, we look for a constructive method to connect any two points in Z^d. The construction must be consistent (that is, satisfy the natural extension of the Euclidean axioms) while resembling the Euclidean segments as much as possible. Previous work has shown asymptotically tight results in two dimensions with Θ(log N) error, where resemblance between segments is measured with the Hausdorff distance, and N is the L1 distance between the two points. This construction was considered tight because of an Ω(log N) lower bound that applies to any consistent construction in Z^2. In this paper we observe that the lower bound does not directly extend to higher dimensions. We give an alternative argument showing that any consistent construction in d dimensions must have Ω(log^{1/(d−1)} N) error. We tie the error of a consistent construction in high dimensions to the error of similar weak constructions in two dimensions (constructions for which some points need not satisfy all the axioms). This not only opens the possibility of having constructions with o(log N) error in high dimensions, but also opens up an interesting line of research into the tradeoff between the number of axiom violations and the error of the construction. A side result, which we find of independent interest, is the introduction of the bichromatic discrepancy: a natural extension of the concept of discrepancy of a set of points. In this paper, we define this concept and extend known results to the chromatic setting.
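To give a feel for discrepancy over a two-colored point set, here is a hedged one-dimensional toy (the paper's bichromatic discrepancy is defined for the planar setting): for points on a line colored red or blue, take the maximum over all intervals of the imbalance |#red − #blue|. With +1/−1 prefix sums this maximum is simply (max prefix) − (min prefix).

```python
def bichromatic_discrepancy_1d(colors):
    """1-D toy bichromatic discrepancy: colors is a left-to-right
    sequence of 'R'/'B' labels; return the maximum over all
    contiguous intervals of |#red - #blue| inside the interval.
    Encode red as +1 and blue as -1; any interval's imbalance is a
    difference of two prefix sums, so the answer is hi - lo."""
    prefix, lo, hi = 0, 0, 0
    for c in colors:
        prefix += 1 if c == 'R' else -1
        lo, hi = min(lo, prefix), max(hi, prefix)
    return hi - lo
```

For example, the sequence R R B R B B B has discrepancy 3, attained by the interval B R B B B (one red, four blue).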

    Rectilinear Link Diameter and Radius in a Rectilinear Polygonal Domain

    We study the computation of the diameter and radius under the rectilinear link distance within a rectilinear polygonal domain of n vertices and h holes. We introduce a graph of oriented distances to encode the distance between pairs of points of the domain. This helps us transform the problem so that we can search through the candidates more efficiently. Our algorithm computes both the diameter and the radius in min{O(n^ω), O(n^2 + nh log h + χ^2)} time, where ω < 2.373 denotes the matrix multiplication exponent and χ ∈ Ω(n) ∩ O(n^2) is the number of edges of the graph of oriented distances. We also provide a faster algorithm for computing the diameter that runs in O(n^2 log n) time.
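For intuition about the rectilinear link distance itself (not the paper's algorithm), a discretized version on a grid of free cells can be computed by 0-1 BFS over (cell, direction) states: moving straight in the current direction is free, turning costs one, and the link count of a path is its number of turns plus one. The grid model and function below are illustrative assumptions.

```python
from collections import deque

def min_link_distance(free, start, goal):
    """Minimum number of axis-parallel segments (links) of a rectilinear
    path from start to goal through the set `free` of open grid cells,
    via 0-1 BFS on (cell, direction) states."""
    if start == goal:
        return 0
    DIRS = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    INF = float("inf")
    dist = {(start, d): 0 for d in DIRS}   # turns so far, per heading
    dq = deque(dist)
    while dq:
        cell, d = dq.popleft()
        turns = dist[(cell, d)]
        # straight step in the current direction: cost 0, relax to front
        nxt = (cell[0] + d[0], cell[1] + d[1])
        if nxt in free and dist.get((nxt, d), INF) > turns:
            dist[(nxt, d)] = turns
            dq.appendleft((nxt, d))
        # turn in place: cost 1, relax to back
        for nd in DIRS:
            if nd != d and dist.get((cell, nd), INF) > turns + 1:
                dist[(cell, nd)] = turns + 1
                dq.append((cell, nd))
    best = min(dist.get((goal, d), INF) for d in DIRS)
    return best + 1 if best < INF else None  # links = turns + 1
```

The quadratic-size state space of this toy hints at why obtaining subcubic diameter algorithms for link distance, as the paper does, requires more structure than brute-force all-pairs search.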